Approximating Bayesian inference by weighted likelihood

Author

  • Xiaogang WANG
Abstract

The author proposes using weighted likelihood to approximate Bayesian inference when no external or prior information is available. He constructs a weighted likelihood estimator that minimizes the empirical Bayes risk under relative entropy loss and discusses connections among the weighted likelihood, empirical Bayes and James–Stein estimators. Both simulated and real data sets are used for illustration.
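The optimal weights in the paper are derived from the empirical Bayes risk under relative entropy loss and are not reproduced here. The short Python sketch below, on hypothetical simulated data with known unit variance, only illustrates the James–Stein connection mentioned in the abstract: in the many-normal-means model, a particular choice of likelihood weights turns the weighted-likelihood estimate into the positive-part James–Stein shrinkage toward the grand mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Many-normal-means setup (hypothetical data): theta_i ~ N(0, tau^2) and
# X_i | theta_i ~ N(theta_i, 1), with the sampling variance taken as known.
p, tau = 50, 1.0
theta = rng.normal(0.0, tau, size=p)
x = rng.normal(theta, 1.0)

# Positive-part James-Stein estimator shrinking each X_i toward the grand mean.
xbar = x.mean()
shrink = max(0.0, 1.0 - (p - 3) / np.sum((x - xbar) ** 2))
js = xbar + shrink * (x - xbar)

# Weighted-likelihood view: for normal likelihoods, maximizing
#   sum_j w_j * log f(X_j; theta_i)  over theta_i
# gives a weighted average of the observations.  Giving every observation a
# common weight (1 - shrink) / p and adding an extra weight `shrink` to X_i's
# own likelihood reproduces exactly the James-Stein shrinkage pattern.
wle = shrink * x + (1.0 - shrink) * xbar

print("James-Stein and weighted-likelihood estimates agree:", np.allclose(js, wle))
print("total squared error, MLE      :", round(float(np.sum((x - theta) ** 2)), 2))
print("total squared error, shrinkage:", round(float(np.sum((js - theta) ** 2)), 2))
```

With these settings the shrinkage estimate typically has markedly smaller total squared error than the raw MLE, which is the kind of behaviour the weighted-likelihood construction is designed to capture.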


Related articles

Pseudo-Likelihood Inference Underestimates Model Uncertainty: Evidence from Bayesian Nearest Neighbours

When using the K-nearest neighbours (KNN) method, one often ignores the uncertainty in the choice of K. To account for such uncertainty, Bayesian KNN (BKNN) has been proposed and studied (Holmes and Adams 2002; Cucala et al. 2009). We present evidence that the pseudo-likelihood approach for BKNN, even after being corrected by Cucala et al. (2009), still significantly underest...
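As a deliberately simple illustration of the setup being criticised, the sketch below builds a pseudo-likelihood over K from leave-one-out KNN predictions and averages predictions over the resulting weights on K; the article's point is that this kind of pseudo-likelihood posterior on K tends to be too concentrated, i.e. it understates model uncertainty. The data and the K range are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-class training sample (hypothetical).
n = 60
X = np.vstack([rng.normal(0, 1, (n // 2, 2)), rng.normal(2, 1, (n // 2, 2))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

def knn_prob(x0, X, y, k):
    """Proportion of the k training points nearest to x0 that carry label 1."""
    d = np.linalg.norm(X - x0, axis=1)
    nearest = np.argsort(d)[:k]
    return y[nearest].mean()

# Pseudo-likelihood of each K: product over training points of the
# leave-one-out predictive probability of the observed label.
ks = np.arange(1, 16)
log_pl = np.zeros(len(ks))
for i, k in enumerate(ks):
    for j in range(n):
        mask = np.arange(n) != j                    # leave point j out
        p1 = knn_prob(X[j], X[mask], y[mask], k)
        p = p1 if y[j] == 1 else 1.0 - p1
        log_pl[i] += np.log(np.clip(p, 1e-6, 1.0))

# Weights over K under a uniform prior, then a model-averaged prediction.
post = np.exp(log_pl - log_pl.max())
post /= post.sum()
x_new = np.array([1.0, 1.0])
p_avg = sum(w * knn_prob(x_new, X, y, k) for w, k in zip(post, ks))
print("pseudo-likelihood weights over K:", np.round(post, 3))
print("model-averaged P(label = 1 | x_new) =", round(float(p_avg), 3))
```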


A Bayesian Nominal Regression Model with Random Effects for Analysing Tehran Labor Force Survey Data

Large survey data sets are often accompanied by sampling weights that reflect the unequal selection probabilities arising in complex sampling designs. Sampling weights act as expansion factors that, by scaling the sampled subjects, make the sample representative of the target population. The quasi-maximum likelihood method is one approach to incorporating sampling weights in the frequentist framewo...
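As a sketch of the quasi-maximum-likelihood idea mentioned above (not the article's model, which is a nominal regression with random effects), the following snippet weights each unit's Bernoulli log-likelihood contribution by its sampling weight and maximizes the weighted sum numerically; the data and weights are simulated placeholders.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Hypothetical complex-survey sample: covariate x, binary response y, and a
# sampling weight w_i proportional to the inverse of the selection probability.
n = 500
x = rng.normal(size=n)
beta_true = np.array([-0.5, 1.2])
p = 1.0 / (1.0 + np.exp(-(beta_true[0] + beta_true[1] * x)))
y = rng.binomial(1, p)
w = rng.uniform(0.5, 3.0, size=n)            # stand-in sampling weights

def neg_weighted_loglik(beta):
    """Quasi-log-likelihood: each unit's contribution is scaled by its weight."""
    eta = beta[0] + beta[1] * x
    ll = y * eta - np.logaddexp(0.0, eta)    # Bernoulli log-likelihood terms
    return -np.sum(w * ll)

fit = minimize(neg_weighted_loglik, x0=np.zeros(2), method="BFGS")
print("weighted (quasi-ML) estimate of beta:", np.round(fit.x, 3))
```

Point estimates come from the weighted score equations; design-based standard errors would additionally require a sandwich-type variance estimator, which is omitted here.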


Triply fuzzy function approximation for hierarchical Bayesian inference

We prove that three independent fuzzy systems can uniformly approximate Bayesian posterior probability density functions by approximating the prior and likelihood probability densities as well as the hyperprior probability densities that underlie the priors. This triply fuzzy function approximation extends the recent theorem for uniformly approximating the posterior density by approximating just...
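The theorem concerns fuzzy rule-based approximators; as a rough numerical analogue only, the sketch below replaces the prior and the likelihood of a conjugate normal model with crude piecewise-constant approximants and checks that the posterior induced by the approximants stays close to the exact posterior. The grid, the model and the nearest-centre approximator are illustrative stand-ins, not the fuzzy systems of the paper.

```python
import numpy as np

# Conjugate normal-normal example: theta ~ N(0, 1) prior, one observation
# x | theta ~ N(theta, 1), so the exact posterior is N(x/2, 1/2).
x_obs = 1.3

def normal_pdf(t, mean, var):
    return np.exp(-(t - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Piecewise-constant approximators of the prior and of the likelihood (a crude
# stand-in for the rule patches of an additive fuzzy system): each function is
# replaced by its value at the nearest of a small set of grid centres.
centres = np.linspace(-5, 5, 41)
def approx(f):
    return lambda t: f(centres[np.argmin(np.abs(centres[None, :] - t[:, None]), axis=1)])

prior_hat = approx(lambda t: normal_pdf(t, 0.0, 1.0))
lik_hat = approx(lambda t: normal_pdf(x_obs, t, 1.0))

# Posterior induced by the approximants vs. the exact posterior, on a fine grid.
fine = np.linspace(-5, 5, 2001)
dt = fine[1] - fine[0]
post_hat = prior_hat(fine) * lik_hat(fine)
post_hat /= post_hat.sum() * dt
post_exact = normal_pdf(fine, x_obs / 2, 0.5)
print("max absolute error of the induced posterior:",
      round(float(np.max(np.abs(post_hat - post_exact))), 4))
```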


Inference for the Type-II Generalized Logistic Distribution with Progressive Hybrid Censoring

This article presents an analysis of Type-II progressively hybrid censored data when the lifetimes of the items follow the Type-II generalized logistic distribution. Maximum likelihood estimators (MLEs) are investigated for the location and scale parameters. It is observed that the MLEs cannot be obtained in explicit form. We provide the approximate maximum likelihood...
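To make the "no explicit-form MLE" point concrete, here is a minimal numerical-MLE sketch for an ordinary Type-II censored sample, with the standard logistic distribution standing in for the Type-II generalized logistic density used in the article; that density and the progressive hybrid censoring scheme would change the likelihood terms, but not the overall approach of maximizing the censored log-likelihood numerically. All sample sizes and parameter values below are made up.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import logistic

rng = np.random.default_rng(3)

# Hypothetical Type-II censored experiment: n items on test, the experiment
# stops once the r smallest lifetimes have been observed.
n, r = 40, 25
mu_true, sigma_true = 2.0, 0.5
lifetimes = np.sort(rng.logistic(mu_true, sigma_true, size=n))
observed = lifetimes[:r]                     # the r smallest order statistics

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                # keep the scale positive
    # Type-II censored log-likelihood (constant factor dropped):
    #   sum of log densities of the observed order statistics
    #   + (n - r) * log survival at the largest observed lifetime.
    ll = logistic.logpdf(observed, loc=mu, scale=sigma).sum()
    ll += (n - r) * logistic.logsf(observed[-1], loc=mu, scale=sigma)
    return -ll

fit = minimize(neg_loglik, x0=np.array([1.0, 0.0]), method="Nelder-Mead")
print("numerical MLE (location, scale):",
      round(float(fit.x[0]), 3), round(float(np.exp(fit.x[1])), 3))
```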


An Introduction to Inference and Learning in Bayesian Networks

Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in diverse areas such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical probabilistic model that represents causal relations among random variables and consists of a directed acyclic graph together with a set of conditional probabilities. Structure...
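A minimal illustration of the two ingredients named here, a directed acyclic graph and a set of conditional probability tables, is sketched below with the familiar rain/sprinkler/wet-grass example; the numbers are hypothetical, and inference is done by brute-force enumeration rather than a specialized inference algorithm.

```python
from itertools import product

# Toy Bayesian network (hypothetical numbers): Rain -> Sprinkler, and both
# Rain and Sprinkler -> WetGrass.  Each node carries a conditional probability
# table given its parents in the directed acyclic graph.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: {True: 0.01, False: 0.99},      # P(Sprinkler | Rain)
               False: {True: 0.40, False: 0.60}}
P_wet = {(True, True): 0.99, (True, False): 0.80,    # P(Wet | Rain, Sprinkler)
         (False, True): 0.90, (False, False): 0.01}

def joint(rain, sprinkler, wet):
    """Joint probability factorizes along the DAG."""
    p = P_rain[rain] * P_sprinkler[rain][sprinkler]
    p_wet = P_wet[(rain, sprinkler)]
    return p * (p_wet if wet else 1.0 - p_wet)

# Inference by enumeration: P(Rain = True | WetGrass = True).
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print("P(Rain | WetGrass) =", round(num / den, 3))
```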



Journal:

Volume   Issue

Pages  -

Publication year: 2006